
    Video advertisement mining for predicting revenue using random forest

    Shaken by the 2008 financial crisis, industries turned to predictive analytics to control inventory levels efficiently and minimize revenue risk. In this third-generation age of web-connected data, organizations emphasized the importance of data science and leveraged data mining techniques to gain a competitive edge. This reflects Web 3.0, where semantic-oriented interaction between humans and computers can offer tailored services or products that meet consumers' needs by learning their preferences. In this study, we concentrate on marketing science to demonstrate the correlation between TV commercial advertisements and sales achievement. Using a range of data mining and machine-learning methods, this research develops a concrete, complete predictive framework to clarify the effects of word of mouth, drawing on open data sources from YouTube. The uniqueness of this predictive model is that we adopt sentiment analysis as one of our predictors. This research offers a preliminary study on unstructured marketing data for further business use.
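As a rough sketch of how comment sentiment could enter such a predictive model, one can score each YouTube comment against a word lexicon and aggregate the scores into a feature row alongside view and like counts. The lexicon, weighting, and feature layout below are illustrative assumptions, not the paper's actual pipeline:

```python
# Minimal sketch of sentiment as a predictor feature (lexicon and
# feature layout are illustrative assumptions, not the paper's method).

POSITIVE = {"great", "love", "amazing", "good", "awesome"}
NEGATIVE = {"bad", "boring", "hate", "terrible", "awful"}

def sentiment_score(comment: str) -> float:
    """Return a score in [-1, 1]: (pos - neg) / total sentiment words."""
    words = comment.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def ad_features(comments, views, likes):
    """Aggregate YouTube signals into one feature row for a random forest."""
    avg_sentiment = sum(map(sentiment_score, comments)) / len(comments)
    return [views, likes, avg_sentiment]

row = ad_features(["love this ad", "so boring"], views=120_000, likes=4_500)
print(row)  # [120000, 4500, 0.0]
```

Each such row would then be one training example for a random forest regressor predicting the revenue target.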

    Learning many-body Hamiltonians with Heisenberg-limited scaling

    Learning a many-body Hamiltonian from its dynamics is a fundamental problem in physics. In this work, we propose the first algorithm to achieve the Heisenberg limit for learning an interacting N-qubit local Hamiltonian. After a total evolution time of O(ε⁻¹), the proposed algorithm can efficiently estimate any parameter in the N-qubit Hamiltonian to ε-error with high probability. The proposed algorithm is robust against state preparation and measurement error, does not require eigenstates or thermal states, and only uses polylog(ε⁻¹) experiments. In contrast, the best previous algorithms, such as recent works using gradient-based optimization or polynomial interpolation, require a total evolution time of O(ε⁻²) and O(ε⁻²) experiments. Our algorithm uses ideas from quantum simulation to decouple the unknown N-qubit Hamiltonian H into noninteracting patches, and learns H using a quantum-enhanced divide-and-conquer approach. We prove a matching lower bound to establish the asymptotic optimality of our algorithm.
    Comment: 11 pages, 1 figure + 27-page appendix
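The Heisenberg-limited O(ε⁻¹) scaling can be illustrated with a classical toy model: estimate an unknown frequency from phase measurements whose evolution time doubles each round, so the final error shrinks in proportion to the longest evolution time used, rather than its square root. Everything below (the frequency, the bounded noise model, and the Kitaev-style phase unwrapping) is an illustrative assumption, not the paper's algorithm:

```python
import math
import random

# Toy illustration of Heisenberg-limited scaling (not the paper's algorithm):
# each round doubles the evolution time, so the final error is bounded by
# NOISE / (longest time), i.e. it shrinks like 1/T in the evolution time T.

random.seed(0)
TRUE_OMEGA = 1.2345   # unknown parameter, assumed for the demo
NOISE = 0.05          # bounded per-measurement phase error (radians)

def measure_phase(t):
    """Simulated experiment: (omega * t) mod 2*pi, plus bounded noise."""
    return (TRUE_OMEGA * t) % (2 * math.pi) + random.uniform(-NOISE, NOISE)

def estimate_omega(n_rounds):
    """Kitaev-style doubling: each round halves the remaining uncertainty."""
    omega = measure_phase(1.0)   # coarse estimate, error <= NOISE
    t = 1.0
    for _ in range(n_rounds - 1):
        t *= 2.0
        measured = measure_phase(t)
        # resolve the 2*pi winding ambiguity using the current estimate
        k = round((omega * t - measured) / (2 * math.pi))
        omega = (measured + 2 * math.pi * k) / t
    return omega, t

omega_hat, t_max = estimate_omega(12)
print(abs(omega_hat - TRUE_OMEGA) <= NOISE / t_max)  # error bounded by noise/T
```

The unwrapping step stays correct because the previous round's error, scaled by the doubled time, remains well below π; the same doubling idea underlies quantum phase-estimation protocols that reach the Heisenberg limit.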

    Learning to predict arbitrary quantum processes

    We present an efficient machine learning (ML) algorithm for predicting any unknown quantum process E over n qubits. For a wide range of distributions D on arbitrary n-qubit states, we show that this ML algorithm can learn to predict any local property of the output from the unknown process E, with a small average error over input states drawn from D. The ML algorithm is computationally efficient even when the unknown process is a quantum circuit with exponentially many gates. Our algorithm combines efficient procedures for learning properties of an unknown state and for learning a low-degree approximation to an unknown observable. The analysis hinges on proving new norm inequalities, including a quantum analogue of the classical Bohnenblust-Hille inequality, which we derive by giving an improved algorithm for optimizing local Hamiltonians. Overall, our results highlight the potential for ML models to predict the output of complex quantum dynamics much faster than the time needed to run the process itself.
    Comment: 10 pages, 1 figure + 38-page appendix; v2: Added a figure and fixed a minor formatting issue
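The "low-degree approximation" ingredient can be sketched classically: query an unknown process as a black box on a grid of inputs and fit a degree-1 model of one local output property by least squares. The depolarizing-style channel and its 0.7 shrink factor are assumptions for the demo, not the paper's setting:

```python
# Classical sketch of fitting a low-degree model to a local output property
# of an unknown process (illustrative only, not the paper's algorithm).

def unknown_process_z(z_in):
    """Black-box stand-in: shrinks the input <Z> by 0.7 (assumed channel)."""
    return 0.7 * z_in

# query the black box on a grid of input <Z> values
xs = [i / 10 for i in range(-10, 11)]
ys = [unknown_process_z(x) for x in xs]

# closed-form ordinary least squares for y = a*x + b
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

print("fitted slope:", round(a, 3))
```

The fitted slope recovers the channel's shrink factor; in the quantum setting the analogous step fits a low-degree polynomial in the input-state description rather than a single scalar.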